
Note for Probability Theory and Stochastic Processes - PTSP By Ramanjaneya Reddy G



PROBABILITY THEORY AND STOCHASTIC PROCESSES

OBJECTIVES:
1. To provide mathematical background and sufficient experience so that the student can read, write and understand sentences in the language of probability theory.
2. To introduce students to the basic methodology of "probabilistic thinking" and to apply it to problems.
3. To understand the basic concepts of probability theory and random variables, and how to deal with multiple random variables.
4. To understand the difference between time averages and statistical averages.
5. To teach students how to apply sums and integrals to compute probabilities and expectations.

UNIT I: Probability and Random Variable
Probability: Set Theory, Experiments and Sample Spaces, Discrete and Continuous Sample Spaces, Events, Probability Definitions and Axioms, Mathematical Model of Experiments, Joint Probability, Conditional Probability, Total Probability, Bayes' Theorem, Independent Events, Bernoulli's Trials.
The Random Variable: Definition of a Random Variable, Conditions for a Function to be a Random Variable, Discrete, Continuous and Mixed Random Variables.

UNIT II: Distribution and Density Functions and Operations on One Random Variable
Distribution and Density Functions: Distribution and Density Functions and their Properties; Binomial, Poisson, Uniform, Exponential, Gaussian, Rayleigh and Conditional Distributions; Methods of Defining a Conditioning Event; Conditional Density Function and its Properties; Problems.
Operations on One Random Variable: Expected Value of a Random Variable, Function of a Random Variable, Moments about the Origin, Central Moments, Variance and Skew, Characteristic Function, Moment Generating Function, Transformations of a Random Variable, Monotonic Transformations for a Continuous Random Variable, Non-monotonic Transformations of a Continuous Random Variable, Transformations of a Discrete Random Variable.

UNIT III: Multiple Random Variables and Operations on Multiple Random Variables
Multiple Random Variables: Vector Random Variables, Joint Distribution Function and Properties, Joint Density Function and Properties, Marginal Distribution and Density Functions, Conditional Distribution and Density Functions, Statistical Independence, Distribution and Density Functions of the Sum of Two Random Variables and of the Sum of Several Random Variables, Central Limit Theorem (Unequal Distributions, Equal Distributions).
Operations on Multiple Random Variables: Expected Value of a Function of Random Variables, Joint Moments about the Origin, Joint Central Moments, Joint Characteristic Functions, Jointly Gaussian Random Variables (Two-Random-Variable Case and N-Random-Variable Case, Properties), Transformations of Multiple Random Variables.

UNIT IV: Stochastic Processes - Temporal Characteristics
The Stochastic Process Concept, Classification of Processes, Deterministic and Nondeterministic Processes, Distribution and Density Functions, Statistical Independence and the Concept of Stationarity: First-Order Stationary Processes, Second-Order and Wide-Sense Stationarity, Nth-Order and Strict-Sense Stationarity, Time Averages and


Ergodicity, Mean-Ergodic Processes, Correlation-Ergodic Processes, Autocorrelation Function and its Properties, Cross-Correlation Function and its Properties, Covariance Functions and their Properties, Gaussian Random Processes.
Linear System Response: Mean and Mean-Squared Value, Autocorrelation, Cross-Correlation Functions.

UNIT V: Stochastic Processes - Spectral Characteristics
The Power Spectrum and its Properties, Relationship between Power Spectrum and Autocorrelation Function, the Cross-Power Density Spectrum and its Properties, Relationship between Cross-Power Spectrum and Cross-Correlation Function.
Spectral Characteristics of System Response: Power Density Spectrum of Response, Cross-Power Spectral Density of Input and Output of a Linear System.

TEXT BOOKS:
1. Probability, Random Variables & Random Signal Principles - Peyton Z. Peebles, TMH, 4th Edition, 2001.
2. Probability and Random Processes - Scott Miller, Donald Childers, Elsevier, 2nd Edition, 2012.

REFERENCE BOOKS:
1. Theory of Probability and Stochastic Processes - Pradip Kumar Ghosh, University Press.
2. Probability and Random Processes with Application to Signal Processing - Henry Stark and John W. Woods, Pearson Education, 3rd Edition.
3. Probabilistic Methods of Signal and System Analysis - George R. Cooper, Clare D. McGillem, Oxford, 3rd Edition, 1999.
4. Statistical Theory of Communication - S. P. Eugene Xavier, New Age Publications, 2003.
5. Probability, Random Variables and Stochastic Processes - Athanasios Papoulis and S. Unnikrishna Pillai, PHI, 4th Edition, 2002.

OUTCOMES: Upon completion of the subject, students will be able to compute:
1. Simple probabilities using an appropriate sample space.
2. Simple probabilities and expectations from probability density functions (pdfs).
3. Likelihood ratio tests from pdfs for statistical engineering problems.
4. Least-squares and maximum likelihood estimators for engineering problems.
5. Mean and covariance functions for simple random processes.


UNIT – 1
PROBABILITY AND RANDOM VARIABLE

PROBABILITY

Introduction
"It is remarkable that a science which began with the consideration of games of chance should have become the most important object of human knowledge." - Pierre-Simon Laplace

A brief history
Probability has an amazing history. A practical gambling problem faced by the French nobleman Chevalier de Méré sparked the idea of probability in the mind of Blaise Pascal (1623-1662), the famous French mathematician. Pascal's correspondence with Pierre de Fermat (1601-1665), another French mathematician, in the form of seven letters in 1654 is regarded as the genesis of probability. Early mathematicians like Jacob Bernoulli (1654-1705), Abraham de Moivre (1667-1754), Thomas Bayes (1702-1761) and Pierre-Simon de Laplace (1749-1827) contributed to the development of probability. Laplace's Théorie Analytique des Probabilités gave comprehensive tools to calculate probabilities based on the principles of permutations and combinations. Laplace also said, "Probability theory is nothing but common sense reduced to calculation." Later mathematicians like Chebyshev (1821-1894), Markov (1856-1922), von Mises (1883-1953), Norbert Wiener (1894-1964) and Kolmogorov (1903-1987) contributed new developments. Over the last three and a half centuries, probability has grown to be one of the most essential mathematical tools applied in diverse fields like economics, commerce, the physical sciences, the biological sciences and engineering. It is particularly important for solving practical electrical-engineering problems in communication, signal processing and computers.
Notwithstanding the above developments, a precise definition of probability eluded mathematicians for centuries. Kolmogorov in 1933 gave the axiomatic definition of probability and resolved the problem.

Randomness arises because of
➢ the random nature of the generation mechanism,
➢ limited understanding of the signal dynamics, and
➢ inherent imprecision in measurement, observation, etc.
For example, the thermal noise appearing in an electronic device is generated by the random motion of electrons. By contrast, we have deterministic models for weather prediction; they take into account the factors affecting weather, and we can locally predict the temperature or the rainfall of a place on the basis of previous data. Probabilistic models are established from observation of a random phenomenon. While probability is concerned with the analysis of a random phenomenon, statistics helps in building such models from data.


Deterministic versus probabilistic models
A deterministic model can be used for a physical quantity and the process generating it, provided sufficient information is available about the initial state and the dynamics of the process generating the physical quantity. For example,
• We can determine the position of a particle moving under a constant force if we know the initial position of the particle and the magnitude and direction of the force.
• We can determine the current in a circuit consisting of resistance, inductance and capacitance for a known voltage source by applying Kirchhoff's laws.
Many physical quantities are random in the sense that they cannot be predicted with certainty and can be described only in terms of probabilistic models. For example,
• The outcome of the tossing of a coin cannot be predicted with certainty; thus the outcome of tossing a coin is random.
• The number of ones and zeros in a packet of binary data arriving through a communication channel cannot be precisely predicted, and so is random.
• The ubiquitous noise corrupting a signal during acquisition, storage and transmission can be modelled only through statistical analysis.

How to Interpret Probability
Mathematically, the probability that an event will occur is expressed as a number between 0 and 1. Notationally, the probability of event A is represented by P(A).
▪ If P(A) equals zero, event A will almost definitely not occur.
▪ If P(A) is close to zero, there is only a small chance that event A will occur.
▪ If P(A) equals 0.5, there is a 50-50 chance that event A will occur.
▪ If P(A) is close to one, there is a strong chance that event A will occur.
▪ If P(A) equals one, event A will almost definitely occur.
In a statistical experiment, the sum of probabilities over all possible outcomes is equal to one. This means, for example, that if an experiment can have three possible outcomes (A, B and C), then P(A) + P(B) + P(C) = 1.
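These two rules can be seen in a small simulation. The sketch below (an illustrative Python example, not part of the syllabus; the function name and trial count are the author's choices) estimates the probability of each face of a fair six-sided die from repeated trials: each estimate comes out close to 1/6, and the estimates sum to one because every trial lands in exactly one of the possible outcomes.

```python
import random

def empirical_probabilities(trials=100_000, seed=0):
    """Estimate P(face) for a fair six-sided die by repeated simulated rolls."""
    rng = random.Random(seed)          # fixed seed so the run is repeatable
    counts = {face: 0 for face in range(1, 7)}
    for _ in range(trials):
        counts[rng.randint(1, 6)] += 1 # each trial lands in exactly one outcome
    # relative frequency of each face approximates its probability
    return {face: n / trials for face, n in counts.items()}

probs = empirical_probabilities()
for face, p in probs.items():
    print(f"P({face}) \u2248 {p:.3f}")     # each value close to 1/6
print("sum =", sum(probs.values()))    # sums to 1, as the axioms require
```

The sum equals one exactly, by construction: the counts partition the trials, so the relative frequencies must add to 1 just as the probabilities of mutually exclusive, exhaustive outcomes do.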
