Probability Theory and Stochastic Processes

Ebook · 825 pages

About this ebook

The focus of this book is to develop the concepts of probability theory and stochastic processes and the techniques used to analyze situations that arise in academic and advanced fields of science and technology. It aims at acquainting the reader with the mainstream of present-day thinking. The book is useful for students at both the undergraduate and postgraduate levels, covers the major parts of the syllabi of most universities in India, and is also useful for competitive examinations. Probability Theory and Stochastic Processes offers students and instructors thorough coverage of the syllabus, backed by solid theory and proper applications. The structure of the textbook encourages and supports completion of an in-depth study.
Salient features
· Probability Theory and Stochastic Processes is rich in presentation, and the problem settings are realistic, stimulating and challenging
· Vast and rich collection of problems
· Simple and clear-cut explanation of concepts with examples
· Quiz Questions
· Exercise Questions
· Review Quiz Questions
· Review Exercise Questions
· Important formulae, theorems, remarks and some rules are emphasized
· Comprehensive coverage, applied approach and lucid presentation
· Relevance and application areas are indicated in the problems
· University examination problems are solved
Language: English
Publisher: BSP Books
Release date: Nov 21, 2019
ISBN: 9789386211927

    Book preview

    Probability Theory and Stochastic Processes - B. Prabhakara Rao

    CHAPTER 1

    Probability

    1.1 Introduction

    In everyday life we come across statements like "he is probably wrong", "the chances of his winning the match (game) are fifty-fifty", "it is very likely that it will rain tonight", etc. None of these statements is mathematically precise, in the sense that we cannot form any definite idea about the occurrence or non-occurrence of the events. But they give an idea of a varying degree of probability of occurrence or non-occurrence of the events.

    So probability is a quantitative measure of uncertainty: a number that conveys the strength of our belief in the occurrence of an uncertain event. Since life is full of uncertainty, people have always been interested in evaluating probabilities. The statistician I. J. Good suggests in his Kinds of Probability that the theory of probability is much older than the human species.

    While the ultimate objective of studying this subject is to facilitate calculation of probabilities in business, management, science, technology, etc., the specific objectives of this chapter are to understand the following terms.

    1.2 Set Definitions

    A set is a collection of objects (e.g., numbers, letters of the English alphabet, voltages, cars, rivers, people, etc.). These objects are called elements or members of the set. A set is generally denoted by a capital letter and the elements in the set by lower-case letters. There are two methods of designating a set. One is the Roster (or tabular) method, in which the elements of a set are listed within braces, e.g., the set A of natural numbers is represented as A = {1, 2, 3, . . .}.

    The other method is Property (Selector or Rule) method wherein the elements of a set are described by their common characteristics. For example, if the set A be the set of all rivers in India, it would be represented as A = {x: x is a river in India} or A = {x /x is a river in India}, where x is an arbitrary element of A.

    A set is said to be countable if its elements can be put in one-to-one correspondence with the natural numbers, which are the integers 1, 2, 3, etc. If a set is not countable, it is called uncountable.

    A set is finite if it contains a finite number of different elements, i.e. if the process of counting the different elements of the set comes to an end. Otherwise a set is infinite, i.e., if the process of counting the different elements would never come to an end.

    Example: 1.      A = {1, 2, 3, 4, 5, 6} is a finite set

    2.      A = {1, 2, 3 . . .} is an infinite set

    Equal Sets

    Two sets A and B are said to be equal, if they contain the same elements, i.e. if and only if every element of A is an element of B.

    Symbolically, A = B iff (a ∈ A ⇒ a ∈ B) ∧ (b ∈ B ⇒ b ∈ A)

    Example: A = {1, 2, 3, 4, 5, 6} and B = {1, 2, 3, 4, 5, 6} then the two sets A and B are equal.

    Null Set or Empty Set or Void Set

    A set is said to be empty if it has no elements. It is denoted by ϕ, and this set is always regarded as a subset of every set. For example, the set of positive integers between 1 and 2 is a null set.

    Singleton Set

    A set that contains only one element is called a singleton set or a unit set. For example, A = {3} is a singleton set.

    Subsets

    If A and B are two sets such that every element of A is also an element of B, then A is called a subset of B (or A is contained in B), and we write A ⊆ B. If A and B are equal sets, then A ⊆ B and B ⊆ A. If A ⊆ B, then a ∈ A ⇒ a ∈ B

    Example: The set A = {1, 2, 4} is a subset of B = {1, 2, 3, 4, 5, 8}

    A set A is said to be a proper subset of B if (i) A is a subset of B, i.e., every element of A is also an element of B, and (ii) A ≠ B, i.e., there is an element in B which is not in A. That A is a proper subset of B is denoted by A ⊂ B. In this case B is called a superset of A.

    Example: The set A = {1, 2, 4} is a proper subset of B = {1, 2, 3, 4, 5, 8}, and B is a superset of A.

    The set A = {1, 2, 4} is an (improper) subset of B = {2, 4, 1}

    Universal Set

    Universal set is the set which contains all the elements of all subsets under investigation in a particular context. We denote this set by U or S.

    e.g., in throwing a die, we get the numbers 1, 2, 3, 4, 5, 6. Here the universal set is S = {1, 2, 3, 4, 5, 6}

    Power Set

    The set of all possible subsets of a set A is called the power set of A, denoted by P(A). If a finite set A has n elements, its power set contains 2ⁿ elements.

    Example: If A= {1,2} then P(A) = {ϕ, {1}, {2}, {1,2}}, where ϕ is the null set.
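    The 2ⁿ count can be checked mechanically. The following Python sketch (an illustration added here, not part of the original text) builds the power set using the standard library:

```python
from itertools import combinations

def power_set(s):
    """All subsets of s, returned as frozensets."""
    items = list(s)
    return [frozenset(c) for r in range(len(items) + 1)
            for c in combinations(items, r)]

A = {1, 2}
P_A = power_set(A)
print(len(P_A) == 2 ** len(A))   # True: the power set has 2**n elements
```

    Running it on A = {1, 2} produces ϕ, {1}, {2} and {1, 2}, matching the example above.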

    1.3 Operations on Sets

    Venn diagram

    Operations on sets, and theorems relating to sets, can be well understood with the help of a Venn diagram. In this diagram the universal set S is denoted by a rectangular region, and any subset of S by a region enclosed by a closed curve (or a circle) lying within the rectangular region.

    Union of Sets

    The union of two sets A and B, denoted A ∪ B, is the set of all elements which belong to A, to B, or to both A and B.

    Symbolically, A ∪ B = {x : x ∈ A or x ∈ B}

    Fig. 1.1 shows the sets A = {1, 2, 7} and B = {1, 2, 3, 4, 5, 8}; then

    A ∪ B = {1, 2, 3, 4, 5, 7, 8}

    Fig. 1.1

    Intersection of Sets

    The intersection of two sets A and B, denoted A ∩ B, is the set of all elements which are common to both A and B.

    Symbolically, A ∩ B = {x : x ∈ A and x ∈ B}

    Fig. 1.2 shows the sets A = {1, 2, 7} and B = {1, 2, 3, 4, 5, 8}; then A ∩ B = {1, 2}

    Fig. 1.2

    Disjoint Sets

    If two sets A and B have no elements in common, i.e. if no element of A is in B and no element of B is in A, then A and B are said to be disjoint or mutually exclusive sets. Clearly A ∩ B = ϕ

    Fig. 1.3 (a)

    Fig. 1.3(b) shows the sets A = {1, 2, 6} and B = {3, 4, 5, 7, 8}, which are disjoint, since they have no common elements; then A ∩ B = ϕ

    Fig. 1.3 (b)

    Difference of Sets

    The difference of two sets A and B is the set of elements which belong to A but which do not belong to B. We denote the difference of A and B by A-B or A ~ B

    Symbolically, A ~ B = {x : x ∈ A and x ∉ B}

    In Fig. 1.4, set A = {1, 2, 3, 4, 5, 6} and B = {2, 3, 4, 5, 7, 8, 9} then

    A ~ B = {1, 6} and B ~ A = {7, 8, 9}

    Fig. 1.4

    Complement of a Set (or Negation of a Set)

    The complement of a set A, denoted by A̅, is the set of all elements not in A. Thus A̅ = S - A

    Note that ϕ̅ = S; S̅ = ϕ; A ∪ A̅ = S; and A ∩ A̅ = ϕ
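    As an aside (not from the text), all of these operations map directly onto Python's built-in set type. Here is a minimal sketch using the sets of Figs. 1.1 to 1.4, assuming S = {1, . . ., 8} as the universal set:

```python
# Sets from Figs. 1.1-1.4, with an assumed universal set S = {1, ..., 8}
S = {1, 2, 3, 4, 5, 6, 7, 8}
A = {1, 2, 7}
B = {1, 2, 3, 4, 5, 8}

union = A | B      # A ∪ B
inter = A & B      # A ∩ B
diff = A - B       # A ~ B: elements of A not in B
comp_A = S - A     # complement of A with respect to S

print(sorted(union))   # [1, 2, 3, 4, 5, 7, 8]
print(A | comp_A == S, A & comp_A == set())   # True True
```

    The last line confirms the identities A ∪ A̅ = S and A ∩ A̅ = ϕ for this example.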

    1.4 Laws of Algebra of Sets

    The three main operations on sets, viz., union, intersection and complement, satisfy certain laws of algebra.

    1.   Commutative Law states that for a pair of sets A and B,

    A ∪ B = B ∪ A and A ∩ B = B ∩ A

    2.   Distributive Law for any three sets A, B and C we have

    A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C) and A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C)

    3.   Associative Law for any three sets A, B and C we have

    (A ∪ B) ∪ C = A ∪ (B ∪ C) and (A ∩ B) ∩ C = A ∩ (B ∩ C)

    4.   Idempotent Law for any set A, we have

    A ∪ A = A and A ∩ A = A

    5.   Identity Law for any set A and the null set ϕ, we have

    A ∪ ϕ = A and A ∩ ϕ = ϕ

    6.   Complement Law for any set A and its complement A', we have

    A ∪ A' = S and A ∩ A' = ϕ

    7.   De Morgan’s Law for any two sets A and B, we have

    (A ∪ B)' = A' ∩ B' and (A ∩ B)' = A' ∪ B'
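    These laws are easy to spot-check numerically. The sketch below (an added illustration, with arbitrarily chosen sets) verifies each law for one concrete case:

```python
A, B, C = {1, 2, 3}, {2, 3, 4}, {3, 4, 5}
S = A | B | C | {6}      # universal set, needed for complements

commutative  = (A | B == B | A) and (A & B == B & A)
distributive = (A | (B & C) == (A | B) & (A | C)) and (A & (B | C) == (A & B) | (A & C))
associative  = ((A | B) | C == A | (B | C)) and ((A & B) & C == A & (B & C))
de_morgan    = (S - (A | B) == (S - A) & (S - B)) and (S - (A & B) == (S - A) | (S - B))

print(commutative, distributive, associative, de_morgan)   # True True True True
```

    A single numeric check is of course no proof, but it is a useful sanity test when manipulating set identities.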

    1.5 Terminology in Probability

    Random Experiment: The term experiment describes an act which can be repeated under given conditions. An experiment whose result (output) depends on chance is called a random experiment. For example, tossing a coin is a random experiment, and similarly throwing a die is a random experiment.

    Event: The output or result of a random experiment is called an event or outcome. For example, in tossing a coin, getting a head or a tail is an event; similarly, in throwing a die, getting 1, 2, 3, 4, 5 or 6 is an event.

    Events are generally denoted by capital letters A, B, C, etc. Events can be split into two types: simple events and compound events. For example, in tossing a coin, getting a head or a tail is a simple event, while in throwing two dice, getting a sum of 6 points is a complex or compound event. The latter can be split into (1, 5), (2, 4), (3, 3), (4, 2) and (5, 1), each of which is a simple event. So compound events can be split further into simple events.

    Mutually Exclusive Events: Two or more events are said to be mutually exclusive if the occurrence of one event precludes (excludes or prevents) the occurrence of the others, i.e., the events cannot happen simultaneously in a single trial. For example, a person may be either alive or dead at a point of time; he cannot be both alive and dead at the same time. Similarly, in tossing a coin, the events head and tail are mutually exclusive, and in throwing a die, all six faces are mutually exclusive.

    Equally Likely Events: Two or more events are said to be equally likely if there is no reason to expect any one of them in preference to the others. That is, if every outcome of the experiment has an equal possibility of occurrence, the events are called equally likely. For example, in tossing a coin, the events head and tail are equally likely, and similarly in throwing a die, all six faces (events) are equally likely.

    Exhaustive Number of Cases or Events: The total number of possible outcomes in an experiment is called the exhaustive number of cases or events. For example, in tossing a coin, the exhaustive number of cases is two (i.e., head and tail), and similarly in throwing a die, the exhaustive number of cases is six (i.e., 1, 2, 3, 4, 5 and 6).

    At Random: means without giving any preference or priority to any case or event. For example, drawing a card from a well-shuffled pack of cards at random, where it may be any card; similarly, asking questions of students at random in a classroom.

    Independent and Dependent Events: Two or more events are said to be independent if the occurrence of one does not affect the occurrence of the other(s). For example, if a coin is tossed twice, the result of the second toss is in no way affected by the result of the first.

    Dependent events are those in which the occurrence or non-occurrence of one event in any trial affects the probability of other events in other trials. For example, if a card is drawn from a pack of playing cards and is not replaced in the deck, the probabilities for the second card drawn are altered.

    Sample Space: The set of all possible outcomes of a random experiment is called a sample space. It is denoted by S. Each outcome in the experiment is called a sample point. There are two types of sample spaces: a finite sample space, in which the number of sample points is finite, and an infinite sample space, in which the number of sample points is infinite.

    For example, in tossing a coin the sample space S = {H, T} is discrete and finite; similarly, in throwing a die, the sample space S = {1, 2, 3, 4, 5, 6} is a discrete and finite sample space.

    An example of a countably infinite sample space is the experiment "choose a positive integer at random", for which S = {1, 2, 3, . . .}.

    A sample space is said to be discrete if it has only finitely many or countably infinitely many points, which can be arranged in a simple sequence e1, e2, . . ., while a sample space containing a non-denumerable number of points is called a continuous sample space. All the points on a line, on a line segment, or in a plane are examples of continuous sample spaces.

    We shall restrict ourselves only to discrete sample spaces.

    1.6 Classical Definition of Probability

    Suppose that an event E can happen in m ways and fail to happen in n ways, all these m + n ways being equally likely and finite. Then the probability of happening of the event, called its success, denoted by P(E) or simply p, is defined as

    P(E) = p = m/(m + n) .....(1.1)

    and the probability of non-happening of the event, called its failure, denoted by P(E̅) or simply q, is defined as

    P(E̅) = q = n/(m + n) .....(1.2)

    From eqns (1.1) and (1.2), we observe that the probability of an event can be defined as the ratio of the number of favourable cases to the total number of equally likely cases.

    It follows that

    p + q = m/(m + n) + n/(m + n) = 1

    This implies that p = 1 - q or q = 1 - p. Hence 0 ≤ P(E) ≤ 1 and 0 ≤ P(E̅) ≤ 1
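    As an added illustration of eqns (1.1) and (1.2), the sketch below uses exact rational arithmetic, taking "rolling a 6 with a fair die" as the event (m = 1 favourable way, n = 5 unfavourable):

```python
from fractions import Fraction

# Classical definition: E can happen in m ways and fail in n ways,
# all m + n ways equally likely (here: rolling a 6 with a fair die)
m, n = 1, 5
p = Fraction(m, m + n)   # P(E), probability of success, eqn (1.1)
q = Fraction(n, m + n)   # P(not E), probability of failure, eqn (1.2)
print(p, q, p + q)       # 1/6 5/6 1
```

    The identity p + q = 1 falls out automatically, as the derivation above requires.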

    If P(E) = 1, then the event E is called certain event. For example, the death of human being or animal is certain, and then the probability of death of human being or animal is one.

    If P(E) = 0, then the event E is called an impossible event. For example, in a women's college, finding a boy student in a classroom is an impossible event, so the probability of finding a boy student is zero; similarly, swimming in air is impossible, so the probability of swimming in air is zero.

    Odds in favour of the event: the ratio of the probability of success to the probability of failure, i.e., m : n or p : q, or the ratio of the number of favourable cases to the number of non-favourable cases.

    Odds against the event: the ratio of the probability of failure to the probability of success, i.e., n : m or q : p, or the ratio of the number of non-favourable cases to the number of favourable cases.

    Limitations of Classical definition of probability

    The classical definition of probability fails in the following cases: if the outcomes of the experiment are not equally likely, or if the number of exhaustive outcomes is infinite.

    1.7 Probability as a Relative Frequency or Statistical (Von-Mises) Definition of Probability

    Let a trial be repeated a large number of times, say N, under essentially identical conditions, and let E be an event of it. The ratio of the number of times (m) the event E occurs to the total number of trials N is called the relative frequency of the event E, denoted R(E) = m/N, and the probability of the event E is defined as P(E) = lim(N→∞) m/N.

    If such a trial is repeated N times and the probability of success of an event E is P(E), then the total number of trials favourable to E is N·P(E); this product is called the mathematical expectation.
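    The limiting behaviour of the relative frequency can be illustrated by simulation. In this added sketch, a fair coin (assumed true probability 0.5) is tossed N times and R(E) = m/N is computed:

```python
import random

random.seed(0)                  # fixed seed so the run is reproducible
N = 100_000                     # number of repeated trials
heads = sum(1 for _ in range(N) if random.random() < 0.5)
rel_freq = heads / N            # R(E) = m / N
print(rel_freq)                 # close to 0.5, and closer as N grows
```

    Increasing N drives R(E) toward the true probability, which is exactly what the statistical definition appeals to.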

    Limitations of the Statistical definition of probability

    If the limit does not exist then the statistical definition of probability fails.

    1.8 Axiomatic Definition of Probability

    The axiomatic approach closely relates the theory of probability with set theory. The axiomatic definition of probability includes both the classical and statistical definitions as particular cases and overcomes the deficiencies of each of them. The axioms thus provide a set of rules; these rules can be used to deduce theorems, and the theorems can be brought together to deduce more complex theorems.

    1.9 Axioms of Probability (Probability Function of the Sample Space)

    Let S be the sample space and let E be any event in S (i.e., E ⊂ S). We define a function P on S (for any event E the functional value P(E)) as a real number satisfying the following axioms:

    Axiom 1: P(E) ≥ 0

    Axiom 2: P(S) = 1

    Axiom 3: If E1 and E2 are mutually exclusive events, then P(E1 ∪ E2) = P(E1) + P(E2)

    1.10 Usual Probability Function

    In a sample space S with N sample points, let E be an event of S containing n(E) sample points. The probability of occurrence of the event E, P(E), is defined as

    P(E) = n(E)/N

    Now we observe that P(E) ≥ 0, P(S) = N/N = 1, and for mutually exclusive events E1 and E2, P(E1 ∪ E2) = [n(E1) + n(E2)]/N = P(E1) + P(E2).

    Since this probability function obeys the axioms of probability, it is called the usual probability function.

    We will explain the mathematical model of experiments with the following examples.

    Example 1: An experiment consists of observing the number of heads when tossing 4 fair coins. We develop a model for this experiment.

    The sample space S consists of the 2⁴ = 16 equally likely outcomes HHHH, HHHT, . . ., TTTT.

    Let X denote the number of heads in a single toss of 4 fair coins.

    Given that X represents the number of heads when tossing 4 fair coins, the range of X is {0, 1, 2, 3, 4}.

    Let A be the event that X < 2, i.e., A = {the number of heads < 2} and B be the event that 1 < X < 3, i.e., B = {1 < the number of heads <3}

    Now we assign the probabilities for these events as
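    The probabilities in this model can be obtained by enumerating all 16 outcomes. The sketch below is an added illustration; it reads the inequalities in A and B strictly, as written:

```python
from itertools import product
from fractions import Fraction

# All 2**4 = 16 equally likely outcomes of tossing 4 fair coins
outcomes = list(product("HT", repeat=4))

def prob(event):
    """Probability = favourable outcomes / total outcomes."""
    return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

# X = number of heads; A = {X < 2}, B = {1 < X < 3}, inequalities strict
P_A = prob(lambda o: o.count("H") < 2)
P_B = prob(lambda o: 1 < o.count("H") < 3)
print(P_A, P_B)   # 5/16 3/8
```

    A collects the 1 + 4 = 5 outcomes with 0 or 1 heads, and B the 6 outcomes with exactly 2 heads.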

    Example 2: An experiment consists of observing the sum of the numbers on the faces of two dice when two dice are thrown. We develop a model for this experiment.

    When two dice are thrown, the sample space S consists of the 36 equally likely ordered pairs (i, j) with i, j ∈ {1, 2, 3, 4, 5, 6}.

    From the given data,

    X = {2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12} represents the possible sums of the numbers on the faces.

    Suppose we are mainly interested in three events defined by

    Assigning the probabilities to these events,

    (i)   A = {Sum on the two dice is 5}

    i.e.,      P(A) = P(5) = P(X=5)

    = P{(1,4), (2,3), (3,2), (4,1)}

    Therefore P(A) = 4/36 = 1/9

    (ii)   B = {4 ≤ Sum on the two dice <7}

    i.e.,      P(B) = P(4 ≤ X<7)

    = P(X = 4) + P(X= 5) + P(X= 6)

    = P{(1,3), (2,2), (3,1)}

       + P{(1,4), (2,3), (3,2), (4,1)} + P{(1,5), (2,4), (3,3), (4,2), (5,1)}

    Therefore P(B) = (3 + 4 + 5)/36 = 12/36 = 1/3

    (iii)   C = {Sum on the two dice > 10}

    i.e.,      P(C) = P(X = 11) + P(X = 12) = P{(5,6), (6,5)} + P{(6,6)}

    Therefore P(C) = 3/36 = 1/12

    This is the mathematical model of the experiment.
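    This model can be checked by enumerating all 36 outcomes (an added illustration, not from the text):

```python
from itertools import product
from fractions import Fraction

# Sample space: 36 equally likely ordered pairs of faces
S = list(product(range(1, 7), repeat=2))

def prob(event):
    """Probability of an event defined on the sum of the faces."""
    return Fraction(sum(1 for i, j in S if event(i + j)), len(S))

P_A = prob(lambda s: s == 5)       # A: sum is 5
P_B = prob(lambda s: 4 <= s < 7)   # B: 4 <= sum < 7
P_C = prob(lambda s: s > 10)       # C: sum > 10
print(P_A, P_B, P_C)   # 1/9 1/3 1/12
```

    The enumeration reproduces the hand counts above: 4, 12 and 3 favourable pairs out of 36.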

    1.11 Some Theorems on Probability

    1.      Probability of an impossible event is zero i.e., P( ϕ) = 0

    Proof: Since we know that S ∪ ϕ = S,

    P(S ∪ ϕ) = P(S)

    P(S) + P(ϕ) = P(S)      ∵ S and ϕ are mutually exclusive events

    ∴ P(ϕ) = 0      ∵ P(S) = 1

    Hence probability of an impossible event is zero i.e., P( ϕ) = 0

    2.      If A is any event in a sample space S and A' is the complementary event of A, then P(A') = 1 - P(A) or P(A) = 1 - P(A')

    Proof:   Since we know that S = A ∪ A',

    P(S) = P(A ∪ A')

    1 = P(A) + P(A')      ∵ P(S) = 1 and A and A' are mutually exclusive events

    ∴ P(A') = 1 - P(A) or P(A) = 1 - P(A')

    Hence, if A is any event in a sample space S and A' is the complementary event of A, then P(A') = 1 - P(A) or P(A) = 1 - P(A').

    3.      If A is any event in the finite sample space, then prove that P(A) equals the sum of the probabilities of the individual outcomes comprising A.

    Proof:

    Let A be any event in the finite sample space which comprises individual outcomes A1, A2, A3, . . . . An (which are mutually exclusive events)

    A = A1 ∪ A2 ∪ A3 ∪ . . . ∪ An

    P(A) = P(A1 ∪ A2 ∪ A3 ∪ . . . ∪ An)

    P(A) = P(A1) + P(A2) + P(A3) + . . . + P(An)      ∵ A1, A2, . . ., An are mutually exclusive

    i.e., if A is any event in the finite sample space, then P(A) equals the sum of the probabilities of the individual outcomes comprising A.

    4.      The probability of any event in a sample space containing equally likely simple events is the same

    Proof:

    Let S be a sample space which comprises the individual outcomes A1, A2, A3, . . ., An (which are mutually exclusive events)

    S = A1 ∪ A2 ∪ A3 ∪ . . . . ∪ An

    P(S) = P(A1 ∪ A2 ∪ A3 ∪ . . . . ∪ An)

    P(S) = P(A1) + P(A2) + P(A3) + . . . + P(An)

    1 = P(A1) + P(A2) + P(A3) + . . . + P(An)      ∵ P(S) = 1

    Since the simple events are equally likely, let P(A1) = P(A2) = P(A3) = . . . = P(An) = P(A). Then

    1 = n · P(A)

    This implies that P(A) = 1/n

    ∴ The probability of any event in a sample space containing equally likely simple events is the same.

    1.12 Joint Probability

    In some experiments, events are not mutually exclusive because of some common elements in the sample space. The probability of occurrence of these common elements in the sample space is called joint probability.

    Theorem: Addition theorem on Probability (or Joint Probability):

    P(A ∪ B ∪ C) = P(A) + P(B) + P(C) - P(A ∩ B) - P(B ∩ C) - P(C ∩ A) + P(A ∩ B ∩ C)

    Proof:

    Fig. 1.5

    From the Fig. 1.5,

    A ∪ B = (A ∩ B') ∪ (A ∩ B) ∪ (A' ∩ B)

    Here (A ∩ B'), (A ∩ B) and (A' ∩ B) are mutually exclusive events

    P(A ∪ B) = P{(A ∩ B') ∪ (A ∩ B) ∪ (A' ∩ B)}

    = P(A ∩ B') + P(A ∩ B) + P(A' ∩ B)

    = P(A ∩ B') + P(A ∩ B) + P(A' ∩ B) + P(A ∩ B) - P(A ∩ B)

    = [P(A ∩ B') + P(A ∩ B)] + [P(A' ∩ B) + P(A ∩ B)] - P(A ∩ B)

    ∴ P(A ∪ B) = P(A) + P(B) - P(A ∩ B) .....(1.3)

    Since we know from the addition theorem on probability that for any two events A and B, P(A ∪ B) = P(A) + P(B) - P(A ∩ B), we extend this to any three events A, B and C.

    P[(A∪B)∪C] = P(A∪B) + P(C)-P[(A∪B)∩C] from(1.3) .....(1.4)

    P[(A∪B)∪C] = P(A) + P(B) - P(A∩B) + P(C) - P[(A∩C)∪(B ∩C)]

    = P(A) +P(B) -P(A∩B) + P(C) - [P(A∩C) +P(B∩C) -P(A∩B∩C)] from (1.3)

    = P(A) +P(B) + P(C) - P(A∩B) -P(B∩C) - P(C∩A) +P(A∩B∩C)

    ∴P(A∪B∪C) = P(A) + P(B) + P(C)-P(A∩B)- P(B ∩C) - P(C ∩ A) + P(A∩B ∩ C)
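    The three-event addition theorem can be verified on a concrete case (an added illustration; the events below are arbitrary subsets of a die-roll sample space):

```python
from fractions import Fraction

# Sample space: one roll of a fair die; each event's probability is |E|/|S|
S = set(range(1, 7))
def P(E):
    return Fraction(len(E), len(S))

A, B, C = {1, 2, 3}, {2, 3, 4}, {3, 5}
lhs = P(A | B | C)
rhs = (P(A) + P(B) + P(C)
       - P(A & B) - P(B & C) - P(C & A)
       + P(A & B & C))
print(lhs, rhs)   # 5/6 5/6
```

    Both sides come out to 5/6, matching the inclusion-exclusion identity just derived.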

    6.   If A ⊆ B, then prove that (i) P(A' ∩ B) = P(B) - P(A) and (ii) P(A) ≤ P(B); or, if A and B are two events of a sample space S such that A ⊆ B, then P(B - A) = P(B) - P(A)

    Proof:

    From the given data A⊆B

    from the Venn diagram

    Fig. 1.6

    P(A' ∩ B) = P(B) - P(A ∩ B)

    = P(B) - P(A)      ∵ A ∩ B = A .....(1.5)

    Also, since P(A' ∩ B) = P(B - A),

    ∴ P(B - A) = P(B) - P(A)

    (ii)      Since,

    P(B) - P(A) = P(A' ∩B) ≥0

    This implies that P(B)≥ P(A) i.e., P(A) ≤ P(B)

    7.   If A and B are mutually exclusive events, then prove that P(A) ≤ P(B')

    Proof:

    Given A and B are mutually exclusive events, then

    P(A ∩ B) = 0      ∵ A ∩ B = ϕ

    Now the addition theorem on probability for two mutually exclusive events gives

    P(A ∪ B) = P(A) + P(B) ≤ 1      ∵ 0 ≤ P(A ∪ B) ≤ 1

    ∴ P(A) + P(B) ≤ 1 ⇒ P(A) ≤ 1 - P(B) = P(B')
