Note for Probability and Statistics (PS) by Bhasakar Naik


Chapter IV: Probability Distributions and Their Applications

… items (the $p$'s) among 5 items (the $p$'s and $q$'s). Therefore the total number of terms is $\binom{5}{2}$, or 10, so that the probability of exactly 2 exceedances in 5 years is $10\,p^2 q^3$.

This result can be generalized: the probability of $X = x$ exceedances in $n$ years is $\binom{n}{x} p^x q^{n-x}$. The result is applicable to any Bernoulli process, so the probability of $X = x$ occurrences of an event in $n$ independent trials, if $p$ is the probability of an occurrence in a single trial, is given by

$$f_X(x; n, p) = \binom{n}{x} p^x q^{n-x}, \qquad x = 0, 1, 2, \ldots, n$$

This equation is known as the binomial distribution. The binomial distribution and the Bernoulli process are not limited to a time scale. Any process that may occur with probability $p$ at discrete points in time or space, or in individual trials, may be a Bernoulli process and follow the binomial distribution.

The cumulative binomial distribution is

$$F_X(x; n, p) = \sum_{i=0}^{x} \binom{n}{i} p^i q^{n-i}, \qquad x = 0, 1, 2, \ldots, n$$

and gives the probability of $x$ or fewer occurrences of an event in $n$ independent trials if the probability of an occurrence in any trial is $p$. Continuing the above example, the probability of fewer than 3 exceedances in 5 years is

$$F_X(2; 5, p) = \sum_{i=0}^{2} \binom{5}{i} p^i q^{5-i} = f_X(0; 5, p) + f_X(1; 5, p) + f_X(2; 5, p)$$

The mean and variance of the binomial distribution are

$$E(X) = np, \qquad \mathrm{var}(X) = npq$$

The coefficient of skew is $(q - p)/\sqrt{npq}$, so the distribution is symmetrical for $p = q$, skewed to the right for $q > p$, and skewed to the left for $q < p$.
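A minimal numerical sketch of these formulas (Python, standard library only). The exceedance probability $p = 0.2$, i.e. a 5-year return period event, is an assumed value chosen purely for illustration:

```python
from math import comb

def binom_pmf(x, n, p):
    """f_X(x; n, p): probability of exactly x occurrences in n trials."""
    q = 1.0 - p
    return comb(n, x) * p**x * q**(n - x)

def binom_cdf(x, n, p):
    """F_X(x; n, p): probability of x or fewer occurrences in n trials."""
    return sum(binom_pmf(i, n, p) for i in range(x + 1))

n, p = 5, 0.2  # assumed: 5-year period, event with a 5-year return period (p = 1/5)

# Probability of exactly 2 exceedances in 5 years: C(5,2) * p^2 * q^3
print(binom_pmf(2, n, p))        # 10 * 0.2**2 * 0.8**3 = 0.2048

# Probability of fewer than 3 exceedances in 5 years: F_X(2; 5, p)
print(binom_cdf(2, n, p))        # f(0) + f(1) + f(2) ≈ 0.9421

# Mean and variance: np and npq
print(n * p, n * p * (1 - p))    # 1.0, 0.8
```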


The binomial distribution has an additive property. That is, if $X$ has a binomial distribution with parameters $n_1$ and $p$, and $Y$ has a binomial distribution with parameters $n_2$ and $p$, then $Z = X + Y$ has a binomial distribution with parameters $n = n_1 + n_2$ and $p$.

The binomial distribution can be used to approximate the hypergeometric distribution if the sample selected is small in comparison to the number of items $N$ from which the sample is drawn. In this case the probability of a success is about the same for each trial.

Example: In order to be 90 percent sure that a design storm is not exceeded in a 10-year period, what should be the return period of the design storm?

Solution: Let $p$ be the probability of the design storm being exceeded in any one year. The probability of no exceedances in 10 years is

$$f_X(0; 10, p) = \binom{10}{0} p^0 q^{10} = (1 - p)^{10}$$

Setting $0.90 = (1 - p)^{10}$ gives

$$p = 1 - (0.90)^{1/10} = 1 - 0.9895 = 0.0105$$

$$T = 1/p \approx 95 \text{ years}$$

Comment: To be 90 percent sure that a design storm is not exceeded in a 10-year period, a 95-year return period storm must be used. If a 10-year return period storm is used instead, the chance of it being exceeded is $1 - f_X(0; 10, 0.1) = 0.6513$. In general, the chance of at least one occurrence of a $T$-year event in $T$ years is $1 - f_X(0; T, 1/T) = 1 - (1 - 1/T)^T$. Therefore, for a long design life, the chance of at least one occurrence of an event with return period equal to the design life approaches $1 - 1/e$, or 0.632. Thus, if the design life of a structure and its design return period are the same, the chances are very great that the capacity of the structure will be exceeded during its design life.
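A quick numerical check of this example and of the general $T$-year result, using only the values stated above (Python, standard library):

```python
# Design storm example: solve 0.90 = (1 - p)**10 for the annual exceedance
# probability p, then compute the return period T = 1/p.
n_years = 10
p = 1.0 - 0.90 ** (1.0 / n_years)    # annual exceedance probability
T = 1.0 / p                          # return period
print(p, T)                          # ≈ 0.0105, ≈ 95 years

# Chance that a 10-year return period storm is exceeded at least once in 10 years.
print(1.0 - (1.0 - 1.0 / 10) ** 10)  # ≈ 0.6513

# As the design life T grows, 1 - (1 - 1/T)**T approaches 1 - 1/e ≈ 0.632.
for T_life in (10, 50, 100, 1000):
    print(T_life, 1.0 - (1.0 - 1.0 / T_life) ** T_life)
```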


4.1.2 Poisson distribution

The Poisson distribution is like the binomial distribution in that it describes phenomena for which the average probability of an event is constant, independent of the number of previous events. In this case, however, the system undergoes transitions randomly from one state with $n$ occurrences of an event to another with $(n + 1)$ occurrences, in a process that is irreversible. That is, the ordering of the events cannot be interchanged. Another distinction between the binomial and Poisson distributions is that for the Poisson process the number of possible events should be large.

The Poisson distribution may be inferred from the identity $e^{-\mu} e^{\mu} = 1$, where the most probable number of occurrences of the event is $\mu$. If the exponential $e^{\mu}$ is expanded in a power series, the probability $p(r)$ that exactly $r$ random occurrences will take place can be inferred as the $r$th term in the series, i.e.,

$$p(r) = \frac{e^{-\mu} \mu^r}{r!} \qquad (4.1.2.1)$$

This probability distribution leads directly to the interpretation that:

$e^{-\mu}$ = the probability that an event will not occur,
$\mu e^{-\mu}$ = the probability that an event will occur exactly once,
$(\mu^2 / 2!)\, e^{-\mu}$ = the probability that an event will occur exactly twice, etc.

The mean and the variance of the Poisson distribution are

$$E(X) = \mu, \qquad \mathrm{Var}(X) = \mu$$

The coefficient of skew is $\mu^{-1/2}$, so that as $\mu$ gets large the distribution goes from a positively skewed to a nearly symmetrical distribution.

The cumulative Poisson probability that an event will occur $x$ times or fewer is

$$p(\le x) = \sum_{r=0}^{x} p(r)$$

Of course, the probability that the event will occur $(x + 1)$ or more times is the complement of $p(\le x)$.

The Poisson distribution is useful for analyzing the failure of a system that consists of a large number of identical components that, upon failure, cause irreversible transitions in the system. Each component is assumed to fail independently and randomly. Then $\mu$ is the most probable number of system failures over the lifetime.
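These expressions can be checked numerically; a minimal sketch (Python, standard library), assuming an illustrative mean of $\mu = 2$:

```python
from math import exp, factorial

def poisson_pmf(r, mu):
    """p(r) = e^{-mu} * mu**r / r!   (equation 4.1.2.1)."""
    return exp(-mu) * mu**r / factorial(r)

def poisson_cdf(x, mu):
    """p(<= x): probability of x or fewer occurrences."""
    return sum(poisson_pmf(r, mu) for r in range(x + 1))

mu = 2.0   # assumed illustrative mean number of occurrences

# First few terms: P(no occurrence), P(exactly one), P(exactly two), ...
print([round(poisson_pmf(r, mu), 4) for r in range(5)])

# The terms sum to 1, reflecting the identity e^{-mu} * e^{mu} = 1.
print(sum(poisson_pmf(r, mu) for r in range(50)))   # ≈ 1.0

# P(more than x occurrences) is the complement of the cumulative probability.
x = 3
print(1.0 - poisson_cdf(x, mu))
```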


To summarize:

• The binomial distribution is useful for systems with two possible outcomes of events (failure or no failure) in cases where there is a known, finite number of (Bernoulli) trials and the ordering of the trials does not affect the outcome.
• The Poisson distribution treats systems in which randomly occurring phenomena cause irreversible transitions from one state to another.

Example: A given nuclear reactor is fueled with 200 assemblies, each of which can fail if the cladding on a fuel rod fails. If each assembly fails in an independent and random manner over the exposure time, calculate the probability of 3 assemblies failing if, on the average, 1% of the fuel assemblies are known to fail. (MacCormick, 1981, p. 34)

Solution: The mean number of assembly failures is $\mu = 0.01 \times 200 = 2$, so using equation (4.1.2.1) for $r = 3$ gives

$$P(3) = \frac{2^3}{3!} e^{-2} = 0.1804$$

As a check, we can use the probability of a single assembly failing, $p = 0.01$, and the binomial distribution with $n = 200$ to obtain

$$P(3) = \frac{200!}{3!(200 - 3)!} (0.01)^3 (0.99)^{200 - 3} = 0.1814$$

4.1.3 Hypergeometric distribution

Drawing a random sample of size $n$ (without replacement) from a finite population of size $N$, with the elements of the population divided into two groups and $k$ elements belonging to one group, is an example of sampling from a hypergeometric distribution. The two groups may be defective or non-defective objects, rainy or non-rainy days, success or failure of a project, etc.

The total number of possible outcomes, or ways of selecting a sample of size $n$ from $N$ objects, is $\binom{N}{n}$. The number of ways of selecting $x$ successes and $n - x$ failures from the population containing $k$ successes and $N - k$ failures is $\binom{k}{x}\binom{N-k}{n-x}$. Thus the probability is

$$f_X(x; N, n, k) = \frac{\binom{k}{x}\binom{N-k}{n-x}}{\binom{N}{n}}$$
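A sketch of the hypergeometric probability and of its binomial approximation when the sample is small relative to the population, as noted earlier (Python, standard library). The values $N = 1000$, $k = 50$, $n = 10$ are assumed purely for illustration:

```python
from math import comb

def hypergeom_pmf(x, N, n, k):
    """Probability of x successes in a sample of n drawn without replacement
    from N items of which k are successes: C(k,x)*C(N-k,n-x)/C(N,n)."""
    return comb(k, x) * comb(N - k, n - x) / comb(N, n)

def binom_pmf(x, n, p):
    """Binomial approximation with p = k/N, reasonable when n is much smaller than N."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Assumed illustrative values: population of 1000 items containing 50 "successes",
# and a sample of 10 drawn without replacement.
N, k, n = 1000, 50, 10
p = k / N

for x in range(4):
    print(x, round(hypergeom_pmf(x, N, n, k), 4), round(binom_pmf(x, n, p), 4))
# The two columns are close because the sample (n = 10) is small compared
# with the population (N = 1000), which is when the approximation applies.
```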
