Notes on Probability

Peter J. Cameron
Contents

1 Basic ideas
  1.1 Sample space, events
  1.2 What is probability?
  1.3 Kolmogorov's Axioms
  1.4 Proving things from the axioms
  1.5 Inclusion-Exclusion Principle
  1.6 Other results about sets
  1.7 Sampling
  1.8 Stopping rules
  1.9 Questionnaire results
  1.10 Independence
  1.11 Mutual independence
  1.12 Properties of independence
  1.13 Worked examples

2 Conditional probability
  2.1 What is conditional probability?
  2.2 Genetics
  2.3 The Theorem of Total Probability
  2.4 Sampling revisited
  2.5 Bayes' Theorem
  2.6 Iterated conditional probability
  2.7 Worked examples

3 Random variables
  3.1 What are random variables?
  3.2 Probability mass function
  3.3 Expected value and variance
  3.4 Joint p.m.f. of two random variables
  3.5 Some discrete random variables
  3.6 Continuous random variables
  3.7 Median, quartiles, percentiles
  3.8 Some continuous random variables
  3.9 On using tables
  3.10 Worked examples

4 More on joint distribution
  4.1 Covariance and correlation
  4.2 Conditional random variables
  4.3 Joint distribution of continuous r.v.s
  4.4 Transformation of random variables
  4.5 Worked examples

A Mathematical notation
B Probability and random variables
Chapter 1

Basic ideas

In this chapter, we don't really answer the question 'What is probability?' Nobody has a really good answer to this question. We take a mathematical approach, writing down some basic axioms which probability must satisfy, and making deductions from these. We also look at different kinds of sampling, and examine what it means for events to be independent.

1.1 Sample space, events

The general setting is: we perform an experiment which can have a number of different outcomes. The sample space is the set of all possible outcomes of the experiment. We usually call it S. It is important to be able to list the outcomes clearly.

For example, if I plant ten bean seeds and count the number that germinate, the sample space is

  S = {0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10}.

If I toss a coin three times and record the result, the sample space is

  S = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT},

where (for example) HTH means 'heads on the first toss, then tails, then heads again'.

Sometimes we can assume that all the outcomes are equally likely. (Don't assume this unless either you are told to, or there is some physical reason for assuming it. In the beans example, it is most unlikely. In the coins example, the assumption will hold if the coin is 'fair': this means that there is no physical reason for it to favour one side over the other.) If all outcomes are equally likely, then each has probability 1/|S|. (Remember that |S| is the number of elements in the set S.)
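The equally-likely rule can be checked by brute-force enumeration. What follows is a minimal sketch in Python, not part of Cameron's notes: it builds the sample space for the three-coin-toss example above and applies the rule that each outcome has probability 1/|S|. The event 'exactly two heads' is our own illustrative choice.

```python
# Minimal sketch (illustrative, not from the notes): enumerate the sample
# space S for three tosses of a fair coin, then use the equally-likely
# assumption, under which each outcome has probability 1/|S|.
from itertools import product

# S = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}, as in the text,
# represented here as tuples such as ('H', 'T', 'H').
S = list(product("HT", repeat=3))
assert len(S) == 8  # |S| = 2^3

# For an event A (a subset of S), equally likely outcomes give
# P(A) = |A| / |S|.  Illustrative event A: 'exactly two heads'.
A = [outcome for outcome in S if outcome.count("H") == 2]
print(len(A) / len(S))  # prints 0.375, i.e. 3/8
```

Note that the same counting rule applied to the bean-seeds example would be wrong, for exactly the reason given above: those eleven outcomes are not equally likely.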